Gibbs Sampling Methods for Stick-Breaking Priors
Abstract
A rich and flexible class of random probability measures, which we call stick-breaking priors, can be constructed using a sequence of independent beta random variables. Examples of random measures that have this characterization include the Dirichlet process, its two-parameter extension, the two-parameter Poisson–Dirichlet process, finite dimensional Dirichlet priors, and beta two-parameter processes. The rich nature of stick-breaking priors offers Bayesians a useful class of priors for nonparametric problems, while the similar construction used in each prior can be exploited to develop a general computational procedure for fitting them. In this article we present two general types of Gibbs samplers that can be used to fit posteriors of Bayesian hierarchical models based on stick-breaking priors. The first type of Gibbs sampler, referred to as a Pólya urn Gibbs sampler, is a generalized version of a widely used Gibbs sampling method currently employed for Dirichlet process computing. This method applies to stick-breaking priors with a known Pólya urn characterization, that is, priors with an explicit and simple prediction rule. Our second method, the blocked Gibbs sampler, is based on an entirely different approach that works by directly sampling values from the posterior of the random measure. The blocked Gibbs sampler can be viewed as a more general approach because it works without requiring an explicit prediction rule. We find that the blocked Gibbs avoids some of the limitations seen with the Pólya urn approach and should be simpler for nonexperts to use.
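The stick-breaking construction from independent beta random variables can be sketched as follows. This minimal sketch draws truncated stick-breaking weights in the special case v_k ~ Beta(1, α), which yields Dirichlet process weights; the function name, the truncation level, and the convention of letting the last break absorb the remaining stick are illustrative assumptions, not details from the article:

```python
import numpy as np

def stick_breaking_weights(alpha, num_sticks, rng=None):
    """Truncated stick-breaking draw of Dirichlet process weights.

    Each break fraction v_k ~ Beta(1, alpha); the k-th weight is
    p_k = v_k * prod_{j<k} (1 - v_j). Setting the final v to 1 makes
    the last weight absorb the leftover stick, so the weights sum to 1.
    (Function name and truncation scheme are illustrative assumptions.)
    """
    rng = np.random.default_rng(rng)
    v = rng.beta(1.0, alpha, size=num_sticks)
    v[-1] = 1.0  # truncation: last break takes whatever stick remains
    remaining = np.concatenate(([1.0], np.cumprod(1.0 - v[:-1])))
    return v * remaining

weights = stick_breaking_weights(alpha=2.0, num_sticks=50, rng=0)
```

Other members of the class described above arise by changing the beta parameters per break, e.g. v_k ~ Beta(1 - a, b + k*a) gives the two-parameter Poisson–Dirichlet process.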
Similar works
Some Further Developments for Stick-breaking Priors: Finite and Infinite Clustering and Classification
SUMMARY. The class of stick-breaking priors and their extensions are considered in classification and clustering problems in which the complexity, the number of possible models or clusters, can be either bounded or unbounded. A conjugacy property for the extended stick-breaking prior is established which allows for informative characterizations of the priors under i.i.d. sampling, and which fu...
Bayesian nonparametric regression with varying residual density.
We consider the problem of robust Bayesian inference on the mean regression function allowing the residual density to change flexibly with predictors. The proposed class of models is based on a Gaussian process prior for the mean regression function and mixtures of Gaussians for the collection of residual densities indexed by predictors. Initially considering the homoscedastic case, we propose ...
Stick-breaking Construction for the Indian Buffet Process
The Indian buffet process (IBP) is a Bayesian nonparametric distribution whereby objects are modelled using an unbounded number of latent features. In this paper we derive a stick-breaking representation for the IBP. Based on this new representation, we develop slice samplers for the IBP that are efficient, easy to implement and are more generally applicable than the currently available Gibbs s...
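A stick-breaking representation for the IBP of the kind this snippet describes can be sketched as follows. It draws a decreasing sequence of feature-inclusion probabilities μ_1 ≥ μ_2 ≥ ... via μ_k = ν_1 ⋯ ν_k with ν_k ~ Beta(α, 1); the function name and truncation level are illustrative assumptions:

```python
import numpy as np

def ibp_feature_probs(alpha, num_features, rng=None):
    """Truncated stick-breaking draw of IBP feature probabilities.

    nu_k ~ Beta(alpha, 1) and mu_k = nu_1 * ... * nu_k, so the
    inclusion probabilities mu_1 >= mu_2 >= ... decay toward zero;
    each object then possesses feature k with probability mu_k.
    (Function name and truncation level are illustrative assumptions.)
    """
    rng = np.random.default_rng(rng)
    nu = rng.beta(alpha, 1.0, size=num_features)
    return np.cumprod(nu)

mu = ibp_feature_probs(alpha=3.0, num_features=20, rng=1)
```

Note the contrast with the Dirichlet process construction: here the μ_k are marginal inclusion probabilities of independent features, not mixture weights, so they need not sum to one.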
Gibbs Sampling for (Coupled) Infinite Mixture Models in the Stick Breaking Representation
Nonparametric Bayesian approaches to clustering, information retrieval, language modeling and object recognition have recently shown great promise as a new paradigm for unsupervised data analysis. Most contributions have focused on the Dirichlet process mixture models or extensions thereof for which efficient Gibbs samplers exist. In this paper we explore Gibbs samplers for infinite complexity ...
Variational Inference for Sequential Distance Dependent Chinese Restaurant Process
The recently proposed distance dependent Chinese Restaurant Process (ddCRP) generalizes the extensively used Chinese Restaurant Process (CRP) by accounting for dependencies between data points. Its posterior is intractable, and so far only MCMC methods have been used for inference. Because of the very different nature of the ddCRP, no prior developments in variational methods for Bayesian nonparametrics are applicable...